According to Shannon's original concept, entropy is the measure assigned to the spectrum of possible states of a given system. A cognitive subject can reduce his/her uncertainty by gaining information. Incoming information can change the subject's knowledge in two ways: it may either eliminate some possible states or change the way the subject "partitions" the world. The former situation corresponds to gaining knowledge by empirically examining the world – the result of an experiment reduces our uncertainty about the world by selecting, out of the whole spectrum of expectable answers to the "experimental question", those which are the most probable. We claim that the expectable answers are not declarations stated...
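As a minimal sketch of the first mechanism described above – an experimental result eliminating some possible states and thereby lowering the entropy – the following Python snippet compares the uncertainty of a hypothetical four-state system before and after two states are ruled out. The state counts and probabilities are illustrative assumptions, not taken from the abstract itself.

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum p*log2(p) over outcomes with p > 0, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior: four equally likely states of the system.
prior = [0.25, 0.25, 0.25, 0.25]

# An experimental answer eliminates two states; the surviving
# probabilities are renormalized over the remaining states.
survivors = [0.25, 0.25]
posterior = [p / sum(survivors) for p in survivors]

print(shannon_entropy_bits(prior))      # 2.0 bits of uncertainty
print(shannon_entropy_bits(posterior))  # 1.0 bit -> 1 bit of information gained
```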
Given a probability space, we analyze the uncertainty, that is, the amount of information of a finit...
Information plays an important role in our understanding of the physical world. Hence we propose an ...
Shannon's notion of relative information between two physical systems can function as foundation fo...
We give a survey of the basic statistical ideas underlying the definition of entropy in information...
Searching for information is critical in many situations. In medicine, for instance, careful choice ...
Shannon's famous paper [1] paved the way to a theory called information theory. In essence, the...
Shannon entropy is often considered as a measure of uncertainty. It is commonly believed that entro...
A variety of conceptualizations of psychological uncertainty exist. From an information-theoretic per...
Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon’s I...
Measures of entropy are useful for explaining the behaviour of cognitive models. We demonstrate tha...
Shannon’s entropy is calculated using probabilities P(i), i.e. S = −Σ_i P(i) ln(P(i)). A proba...
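As a quick numerical check of the formula quoted above, the sketch below evaluates S = −Σ_i P(i) ln(P(i)) for two small distributions; the probabilities are hypothetical and chosen only for illustration.

```python
import math

def entropy_nats(probs):
    """S = -sum_i P(i) * ln(P(i)), in nats; outcomes with P(i) = 0 contribute nothing."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log(p) for p in probs if p > 0)

# Illustrative distributions over three outcomes.
print(entropy_nats([0.5, 0.25, 0.25]))     # ~1.040 nats
print(entropy_nats([1/3, 1/3, 1/3]))       # ln(3) ~ 1.099 nats, the maximum for three outcomes
```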
The expression for entropy sometimes appears mysterious – as it is often asserted without justificat...
This paper reveals errors within Norwich et al.’s Entropy Theory of Perception, errors that have bro...
Since its inception, the concept of entropy has been applied in various fields such as computer science...
Learning is an important process that allows us to reduce the uncertainty of the outcomes of our dec...